Data Center Tours: Iron Mountain BOS-1, Northborough, MA

Oct. 23, 2023
The Iron Mountain BOS-1 Data Center is a colocation facility with 21,000 SF of usable floor space and 3.6 MW of power, with room to expand on its 58-acre campus. This article is the first in a series recounting DCF's Fall 2023 data center tours.

I pull up to the Iron Mountain BOS-1 Data Center just west of Boston on a crisp fall day and step out of my truck. In a mood to take care of business, I immediately snap a few photos of the building.

A man wearing a blue Iron Mountain uniform shirt quickly steps outside and walks over to ask what I'm doing there. I tell him I'm from Data Center Frontier, here for a tour. We shake hands, then walk inside, where Jason Brunetti, Data Center Operations Manager for Iron Mountain Data Centers, meets me in the lobby.

 "We've got 24 hours security, and they're our guys," explains Brunetti. "They're not like a rental security situation. These guys work for Iron Mountain, they pass all the same background checks that we do, and it's the same check that we ask our customers to pass as well."

The Iron Mountain BOS-1 Data Center is a colocation facility with 21,000 sq. ft. of usable floor space and 3.6 MW of power, with significant potential to scale up on its 58-acre campus.

Rated to withstand Category 4 hurricane winds, the compact BOS-1 sits along a quiet, widely spaced technology drive at 171 Bearfoot Road, in the Boston suburb of Northborough, Massachusetts, in Worcester County.

As a ten-year-old colocation facility in the New England area, "it's a little different," says Brunetti.

"We built small because the power is very expensive up here," he says. "Generally, if a customer comes to us or one of our competitors, they're going to say, we have 40 cabinets, but it's really expensive to put 40 cabinets here. But we want to touch ten of them on a monthly basis. So they'll install ten here, and then take another 30 to another data center where power is a lot cheaper. Plus we've got some customers that have an entire data center right here."

"We've got enough land here to create another two data centers just like we've got," Brunetti adds. I ask a nearly rhetorical question next, which is whether that's definitely going to happen. "If we get the customers, yeah," he says. "If we have an anchor customer, we would gladly build another building."

From the NOC to the Dock

Before the data center tour starts in earnest, we poke our heads into the BOS-1 Network Operations Center. Inside the NOC, mounted on the wall facing us is, in strictly technical parlance, a much-larger-than-average video wall.

"We're always adding, as we create more dead spots and spaces, we'll add more cameras to pick up footage," explains Brunetti. "They hit 24 hours a day, seven days a week. We run our staff to cover the 7:00 a.m. to 7:00 p.m. slot, then anytime after that, it's an on-call situation. Most of the guys here live within 15 minutes, so they could be here pretty quick if there's anything that they need to do."

The BOS-1 Data Center's compliance roster includes certifications for: NIST 800-53; FISMA HIGH; FedRAMP; CMMC; SOC 2 Type II, SOC 3; PCI-DSS; HIPAA; HITRUST; LEED Gold; ISO 9001, 27001, 50001, 14001, 22301; and Uptime Institute Tier 3 Certification.

"Compliance is a huge thing with our customers," notes Brunetti, adding, "We are constantly being audited. It's something that we're used to. We're really good at it. We never have any issues here. A couple of times a year, someone's coming into audit, and our customers get audited as well. So then we've got to supply most of the information to them too. We're just so used to it - and the customers like that."

We pass through the facility's loading dock area and several nearby work bays. Spare filters for the rooftop air handling units are stacked in a storage area. "We debox here," says Brunetti. "We don't allow cardboard in the data center, so a lot of the boxes come off right here. In this other workspace, they can bring a rack down and fully load the rack in a nice comfortable space."

"I have three guys on-site in my ops team," he continues. "One is a master electrician and the other two have been in communications for probably almost 30 years combined. We do a lot of stuff ourselves. If we can't handle it ourselves, if we're too busy, we'll have a contractor come in, obviously, but we try our best to save money where we can."

BOS-1 Notes: Power and Cooling, Fire Suppression, Lighting, Generators, Supply Chain

Designed as Boston’s first-ever LEED Gold multi-tenant data center, Iron Mountain BOS-1 employs waterless cooling to save millions of gallons of water annually, and uses 100% renewable power.

"When the building was built, we were a bit ahead of the game," notes Brunetti. "It's all about redundancy. We're N+1 for our AC systems, for the generators. We have two humidifiers. We've pretty much got an extra one of everything. It's diesel generators for backup power and a UPS system to pick up that in between time."

The local power utility serving the facility is National Grid. "They're actually a couple doors down from us," says Brunetti. "That's their national call center, which makes this a great neighborhood to be in. We're the first ones to come back up with power in a storm if they do go down. So this was a great place for Iron Mountain to build a data center."

"They put this building together in about seven or eight months," he adds. "If we did have another customer, we would add another building onto the side and the hallways would match up. And we already have all the taps in place as part of the original construction so it wouldn't interfere with customer uptime. You wouldn't have to shut down the building to add a tap to the next building. It's ready to go."

The BOS-1 Data Center's strategic location in the Boston market also offers access to 37.6 total MW of renewable power.

"We have the Starline system, which is a huge highlight of our data center," adds Brunetti. "The Starline allows us to add and remove power easily. We can change it out. If we had a customer that needed to upgrade their power, we could simply add a power disconnect right next to their old one, and they can at their leisure swap over to their new power plan. Then once they're done, we can go in and remove that without any downtime.

The Iron Mountain BOS-1 Data Center's fire riser room also contains the facility's lighting rack and security system. The immaculately dressed lighting rack uses Power over Ethernet, with category-rated network cabling powering every light in the facility.

"This system here saves the building I think almost 60% on power compared to a traditionally cabled building," says Brunetti. "If all your lights are running on 110, you're sucking up a lot of power. All these are LED, I can control every light, I can make them dim to whatever levels, it's completely customizable. We save a lot of money with that. It's another LEED Gold thing."

We enter the data center's MEP [mechanical, electrical, and plumbing] yard to inspect the facility's two megawatt diesel generators, one on each side of the yard. Each is housed in a heated hurricane shelter with a 4,000-gallon tank, and a full load of fuel provides about 72 hours of runtime.

"They are quarterly maintained," adds Brunetti. "We run them before storms during the winter. These are heated and closed." There's one giant fan at the backside of the enclosure. Louvers on either end open up when the generator operates, as air flows through to cool the entire engine.

Outside the generator shelter, Brunetti points out BOS-1's four rooftop AC units, whose compressors use a sealed fluid system for cooling. These 120-ton packaged air handlers are Trane and Tele Pack units.

Brunetti observes, "In a data center, we're trying to conserve power. We're not trying to light up all four compressors if we don't need them. So we've added a VFD [variable frequency drive]; what that allows us to do is light up one, two, three or four compressors, whatever the building calls for. This is also N+1. If I had a full load in the building, I would be running three units or four units at 75%. That way if I needed to work on one, I could put the entire building load on three units and work on the other one, the one you just shut down."
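Brunetti's "four units at 75%" rule of thumb is just the N+1 arithmetic. A minimal sketch of that staging logic, assuming identical units; the 120-ton capacity comes from the article, while the example building load is hypothetical:

import math

UNIT_CAPACITY_TONS = 120   # per-unit capacity, from the article
TOTAL_UNITS = 4

def stage_units(load_tons: float) -> tuple[int, float]:
    """Return (units to run, per-unit load fraction) while keeping N+1 margin."""
    needed = math.ceil(load_tons / UNIT_CAPACITY_TONS)  # units strictly required
    running = min(needed + 1, TOTAL_UNITS)              # plus one spare for N+1
    return running, load_tons / (running * UNIT_CAPACITY_TONS)

# Example: a hypothetical 360-ton building load.
running, fraction = stage_units(360)
print(f"Run {running} units at {fraction:.0%} each")  # Run 4 units at 75% each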

I ask Brunetti about supply chain issues, from the pandemic up until now.

"It's almost back to normal - almost," he says. "The high priced items still take a long time to get. Like if I needed to get a monstrous water pump or something like that, it would take over a year. I know a couple of my counterparts throughout [Iron Mountain] have had issues getting large parts like cooling towers. And it's like a two year wait for a generator and stuff like that, the big stuff."

Inside the BOS-1 Data Hall

The BOS-1 data hall sits behind plenum walls on a three-foot-deep raised floor. The data center serves customers from Massachusetts, Rhode Island, New Hampshire and Connecticut. "For Boston, it may not be big for any other data center, but a big build out would be something like this, which is like 16 cabinets up to 30 or 40 cabinets," says Brunetti. "Those are pretty large build outs for us here in Boston."

Pointing to a plexiglass wall over the aisle, Brunetti explains, "We have some cooling modifications like this that'll take the PUE down a little bit, because that holds that cold air in there, so that the devices can pull that air through and use it, rather than that cold air just kind of getting sprayed all over the room and wasted. The AC units push the air into vents that go down that wall, then it goes below the floor. We pick and choose where we want the air to come up."
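The effect of that containment on PUE comes down to one ratio, total facility power divided by IT power. A minimal sketch with invented numbers, meant only to show the direction of the change Brunetti describes:

# PUE = total facility power / IT power.
# All values here are hypothetical, chosen only to illustrate how trimming
# cooling energy (e.g., with aisle containment) nudges PUE downward.
def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    return (it_kw + cooling_kw + other_kw) / it_kw

before = pue(it_kw=1000, cooling_kw=450, other_kw=100)  # 1.55
after = pue(it_kw=1000, cooling_kw=380, other_kw=100)   # 1.48, less cooling waste
print(f"PUE before containment: {before:.2f}, after: {after:.2f}")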

Directing my attention to the building monitoring system console on the wall, Brunetti tells an anecdote:

"That BMS actually will help us regulate the temperature in the data center. When I was in construction, working in data centers, they had them at 60 degrees, and their thought was that if it was at 60 degrees instead of 70, they've got ten degrees of time to work on whatever's broken and get it back up. They thought that would get them some time. It actually costs you more money to keep it at 60 degrees, than it helps you in time. It really only gives you like an extra minute. That's why you've got these redundancies nowadays. They were paying God knows what money to keep it it at 60 degrees all the time. I used to have to wear a hoodie to work in these places. Nowadays, close to 68, 70 degrees is fine - it's that nice, happy medium we have."

Interconnection and Cloud Factors

Iron Mountain BOS-1 hosts a diverse selection of NSPs [network service providers] for easy, in-house cloud and carrier access. A concurrently maintainable facility, the data center houses four carriers on-site, while providing metro access to the AWS and Azure clouds and WAN connectivity options.

"We have two POP rooms - primary point of entrance and a secondary point of entrance," says Brunetti. "Any of the circuits coming into this building from outside, most of the ones that come from the POP are going to take a left down the road. The ones coming from our S-POP are going to take a right down the road and it's all a different route. So that's redundancy, we've got two different routes. If there's ever an issue with one side, the other side can pick it up. Even our route back to Boston to pick up all of our vendors, that is redundant as well."

I ask whether BOS-1 would ever be likely to host any type of AI workload.

In Brunetti's opinion, "You wouldn't expect to see it here, not here in New England, I don't think. AI takes just so much compute power. I think it's just too expensive for all that compute. It's such a large array that you need for all that AI power. I haven't seen it yet. They'd stick it in Virginia or one of those data centers."

Then Brunetti holds forth with this interesting insight:

I'm not in charge of this stuff, so I just see it, but for some builds, they have two data centers and one's a backup and one's the main data center. And then you'll have another company that's got two data centers and they're working them both at halfway, so then if one failed, it would all go over to the other side. Then you've got people that depend on two data centers, and they've got also cloud dependencies as well. They're depending on AWS to run some things, and they expect AWS to never go down, because they've got just so many machines. It's funny how they all have different approaches to their redundancies and what they want to protect.
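The approaches Brunetti lists, active-passive and active-active split at roughly half load, differ mainly in how much idle capacity each side carries. A toy comparison, with made-up load and site-capacity numbers:

# Toy comparison of the two two-site redundancy models described above.
# Load and per-site capacity values are hypothetical.
LOAD_MW = 2.0            # total workload to protect
SITE_CAPACITY_MW = 2.0   # each site sized to carry the full load on its own

# Active-passive: one site carries everything, the other idles as standby.
active_passive = [LOAD_MW / SITE_CAPACITY_MW, 0.0]

# Active-active: both sites run at half load; either absorbs the full load on failover.
active_active = [LOAD_MW / 2 / SITE_CAPACITY_MW] * 2

print("Active-passive per-site utilization:", active_passive)  # [1.0, 0.0]
print("Active-active per-site utilization: ", active_active)   # [0.5, 0.5]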

Offering wholesale and retail colocation options, BOS-1's list of customer services includes: on-site smart hands; cloud on-ramp; cross connect; IP transit; metro wave; peering; and private network transport capabilities. The facility provides low-latency cloud and carrier links via the nearby 1 Summer Street strategic interconnection hub and carrier hotel.

One to Learn On

As we exit the Iron Mountain BOS-1 data hall, Brunetti tells me:

This is a great data center to learn in because it's really small. I had a rookie guy about a year and a half ago, and the kid's an expert now. That's a lot harder to do in a large data center where you've got a million things going on, and you generally throw the rookie guy on the easy stuff. He's learned everything, a seasoned vet already. Everything's so hands-on and there's only three guys. Everybody helps everybody out. It's a cliché thing to say, but we're a pretty tight team.

JLL's H1 2023 North American Data Center Report warns that a lack of hyperscaler presence makes new data center development in the Boston market unlikely in the near future. The report puts the Boston market's total inventory at 1,200,000 SF and 160 MW, with 225,000 SF and 22 MW vacant. At the time of the report, no supply was under construction or planned for the market.

JLL said that demand in the Boston data center market remains modest but steady, with most of it coming from local firms in the health, biotech, and financial services industries. While space is available in all submarkets (City, Rt. 128 and Rt. 495), absorption continues to be strongest in the City submarket, due to higher connectivity, and in the 495 submarket, due to lower cost. Energy cost and availability are cited as the most significant challenges for data center providers in the Boston market.


About the Author

Matt Vincent

A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.
